# Sparse mixture of experts
## Turkish Deepseek

License: Apache-2.0

A language model trained on Turkish text, based on the DeepSeek architecture and incorporating Multi-Head Latent Attention (MLA) and Mixture of Experts (MoE); a routing sketch follows below.

Tags: Large Language Model, Transformers, Other

Author: alibayram
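To make the "sparse" part of sparse mixture of experts concrete, here is a minimal PyTorch sketch of top-k expert routing of the kind used in DeepSeek-style MoE layers: each token is sent to only a few experts chosen by a learned gate, so most expert parameters stay inactive per token. All names (`SparseMoE`, `num_experts`, `top_k`) are illustrative assumptions, not taken from the Turkish Deepseek release.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseMoE(nn.Module):
    """Illustrative sparse MoE layer with top-k gating (not the model's actual code)."""

    def __init__(self, d_model: int, d_ff: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Learned gate scores each token against every expert.
        self.gate = nn.Linear(d_model, num_experts, bias=False)
        # Each expert is an independent feed-forward block.
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq, d_model) -> flatten to (tokens, d_model) for routing.
        tokens = x.reshape(-1, x.size(-1))
        logits = self.gate(tokens)                        # (tokens, num_experts)
        weights, indices = logits.topk(self.top_k, dim=-1)
        weights = F.softmax(weights, dim=-1)              # renormalize over chosen experts

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # Which tokens routed any of their top-k slots to expert e.
            token_idx, slot = (indices == e).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue  # expert receives no tokens this step
            out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(tokens[token_idx])
        return out.reshape_as(x)

# Usage: route a batch of hidden states through the sparse layer.
layer = SparseMoE(d_model=64, d_ff=256, num_experts=8, top_k=2)
y = layer(torch.randn(2, 10, 64))  # (2, 10, 64)
```

With `top_k=2` of 8 experts, each token activates roughly a quarter of the expert parameters, which is how MoE models grow total capacity without a proportional increase in per-token compute.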